Guiding Backprop by Inserting Rules

Authors

  • Sebastian Bader
  • Nuno C. Marques
Abstract

We report on an experiment in which we inserted symbolic rules into a neural network during the training process, in order to guide the learning and to help the network escape local minima. The rules are constructed by analysing the errors the network makes after training. This process can be repeated, allowing the network's performance to be improved again and again. We propose a general framework and provide a proof of concept of the usefulness of our approach.
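
The abstract describes an iterative loop: train with backprop, analyse the errors made by the trained network, construct symbolic rules from those errors, insert the rules into the network, and retrain. The sketch below illustrates that loop in PyTorch; the helpers extract_rules_from_errors and insert_rules are hypothetical placeholders (the paper's concrete rule extraction and insertion mechanisms are not reproduced here), and the data and network sizes are arbitrary.

```python
import torch
import torch.nn as nn

def extract_rules_from_errors(model, X, y):
    """Hypothetical helper: inspect the examples the trained network still gets
    wrong and construct simple symbolic rules from them (details omitted)."""
    with torch.no_grad():
        wrong = model(X).argmax(dim=1) != y
    # ... analyse X[wrong] and y[wrong] to build rules; omitted here
    return []

def insert_rules(model, rules):
    """Hypothetical helper: encode each rule into the network, e.g. by fixing
    the weights and bias of a dedicated hidden unit, so that training resumes
    from a configuration that already respects the rule."""
    for rule in rules:
        pass  # adjust selected weights/biases according to the rule

model = nn.Sequential(nn.Linear(10, 32), nn.Sigmoid(), nn.Linear(32, 3))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
X, y = torch.randn(200, 10), torch.randint(0, 3, (200,))   # placeholder data

for round_ in range(3):              # repeat: train, analyse, insert rules
    for epoch in range(100):         # ordinary backprop phase
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    rules = extract_rules_from_errors(model, X, y)
    insert_rules(model, rules)       # guidance intended to help escape local minima
```

One common way to realise insert_rules is KBANN-style weight setting, where each rule is compiled into the weights and bias of a dedicated hidden unit; whether the authors use that particular encoding is not stated in this abstract.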

Similar articles

The Obturator Guiding Technique in Percutaneous Endoscopic Lumbar Discectomy

In conventional percutaneous disc surgery, introducing instruments into the disc space starts by inserting a guide needle into the triangular working zone. However, landing the guide needle tip on the annular window is a challenging step in endoscopic discectomy. Surgeons tend to repeat the needling procedure to reach an optimal position on the annular target. The obturator guiding technique is a modif...

Learning Many Related Tasks at the Same Time with Backpropagation

Hinton [6] proposed that generalization in artificial neural nets should improve if nets learn to represent the domain's underlying regularities. Abu-Mustafa's hints work [1] shows that the outputs of a backprop net can be used as inputs through which domain-specific information can be given to the net. We extend these ideas by showing that a backprop net learning many related tasks at the sam...
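
As a rough illustration of the multitask idea in this snippet, the sketch below trains several output heads on top of one shared hidden layer, so that backprop on every task shapes the shared representation. The sizes, synthetic data, and sum-of-losses objective are assumptions for illustration, not the cited paper's exact setup.

```python
import torch
import torch.nn as nn

# One shared hidden layer learns a representation used by every task; each
# task gets its own small output head.
shared = nn.Sequential(nn.Linear(20, 64), nn.ReLU())
heads = nn.ModuleList([nn.Linear(64, 1) for _ in range(4)])   # one head per task

opt = torch.optim.SGD(list(shared.parameters()) + list(heads.parameters()), lr=0.05)

X = torch.randn(256, 20)   # placeholder inputs shared by all tasks
Y = torch.randn(256, 4)    # one target column per task

for epoch in range(200):
    opt.zero_grad()
    h = shared(X)
    # Summing the per-task losses means every task's gradient shapes the
    # shared layer, which is where the cross-task transfer happens.
    loss = sum(nn.functional.mse_loss(head(h).squeeze(1), Y[:, i])
               for i, head in enumerate(heads))
    loss.backward()
    opt.step()
```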

Backprop as Functor: A compositional perspective on supervised learning

A supervised learning algorithm searches over a set of functions A → B parametrised by a space P to find the best approximation to some ideal function f : A → B. It does this by taking examples (a, f(a)) ∈ A×B, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent—with respect to a fixed step size and an err...
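
The following sketch illustrates the compositional idea in this snippet: a learner bundles a parametrised function with an update rule and a "request" map that passes a corrected input back upstream, and two learners compose in sequence. The interfaces and the gradient-descent example are a loose rendering for illustration only; the paper's formal definitions (error functions, invertibility conditions, the precise request map) are not reproduced here.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Learner:
    param: object
    implement: Callable   # (param, a) -> b
    update: Callable      # (param, a, target) -> new param
    request: Callable     # (param, a, target) -> new input

def compose(g: "Learner", f: "Learner") -> "Learner":
    """Sequential composition (f runs first); updates flow backwards."""
    def implement(p, a):
        pf, pg = p
        return g.implement(pg, f.implement(pf, a))

    def update(p, a, c):
        pf, pg = p
        b = f.implement(pf, a)
        # g tells f what output it would have preferred (the request), and
        # each part updates its own parameter against its own local target.
        return (f.update(pf, a, g.request(pg, b, c)), g.update(pg, b, c))

    def request(p, a, c):
        pf, pg = p
        b = f.implement(pf, a)
        return f.request(pf, a, g.request(pg, b, c))

    return Learner((f.param, g.param), implement, update, request)

EPS = 0.01  # fixed step size

def scale(p0: float) -> Learner:
    """One-parameter learner b = p * a, trained by gradient descent on squared
    error; the request map takes a gradient step on the input (a simple
    illustrative choice, not necessarily the paper's exact construction)."""
    return Learner(
        param=p0,
        implement=lambda p, a: p * a,
        update=lambda p, a, b: p - EPS * 2 * (p * a - b) * a,
        request=lambda p, a, b: a - EPS * 2 * (p * a - b) * p,
    )

pair = compose(scale(-0.3), scale(0.5))        # compose two update rules
b = pair.implement(pair.param, 1.0)            # forward pass through both parts
new_param = pair.update(pair.param, 1.0, 2.0)  # one joint training step
```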

2 MECHANISMS OF MULTITASK BACKPROP

Hinton [6] proposed that generalization in artificial neural nets should improve if nets learn to represent the domain's underlying regularities. Abu-Mustafa's hints work [1] shows that the outputs of a backprop net can be used as inputs through which domain-specific information can be given to the net. We extend these ideas by showing that a backprop net learning many related tasks at the same tim...

Carotid-compression technique for the insertion of guiding catheters.

Inserting a guiding catheter into a tortuous artery for neurointerventional procedures can be difficult. In our technique, the carotid artery is manually compressed to stabilize and/or straighten the inserted wire before advancing the guiding catheter. Although this technique is not risk-free and care must be taken to avoid vascular injury by excessive compression, it is useful for the insertio...


Journal:

Volume   Issue

Pages  -

Publication date: 2008